Bigger Stick, Better Compliance?
Testing Strength of Public Record Statutes on Agency Transparency in the United States
David Cuillier *
University of Arizona
Global Conference on Transparency Research
Rio de Janeiro, Brazil
June 26, 2019
Abstract
This study explores demographic, political, and statutory correlates of state agency compliance
with public record laws by employing a transparency measure derived from a dataset of more
than 7,000 public record requests submitted to state agencies in the United States through the
nonprofit organization MuckRock from 2014 through 2017. Findings indicate that states with
higher public records request compliance demonstrate less perceived corruption, are more
politically liberal, higher in social capital, and include mandatory attorney fee-shifting provisions
in their state laws. Regression analysis indicates the primary predictor of compliance is political
culture, such that traditionalistic states, primarily in the South, demonstrate lower compliance
with public record laws than moralistic or individualistic states. No relationships were found
between compliance and other statutory provisions, including penalties. Also, no correlations
were found between compliance and previous ratings of state laws and proactive posting of
records online. Implications for future research, assumptions about the effects of public record
laws, and priorities for good governance initiatives are discussed.
Keywords: Freedom of information, access to public records, government transparency
* Correspondence concerning this article should be addressed to David Cuillier, School
of Journalism, University of Arizona, Tucson, AZ 85721. Contact: cuillier@email.arizona.edu
Introduction
The ability of citizens to access government information is assumed essential for an
informed electorate to self-govern, and that right is enshrined in the United States and more than
120 other nations through public record laws. Some experts and journalists, however, question
whether public record laws make a difference, or even inhibit access to government information.
They point to long delays, high fees, and low compliance by agencies (LaFleur, 2011; Prime &
Russomanno, 2018; ProPublica, 2016; Wagner, 2017). Journalists often avoid submitting public
record requests, anticipating delays long past their deadlines and a resulting stack of paper
blotted out with black ink (Bluemink & Brush, 2005; Cuillier, 2011; Leopold, 2015). In the
United States, at the federal level, a requester can expect to receive what was asked for only 1 out
of 5 times (Bridis, 2018). As a result, the number of lawsuits filed against U.S. agencies for lack
of compliance more than doubled from 2008 to 2018, with more than 1,200 cases pending
(Mehta, 2018).
The lack of compliance makes some wonder if statutory provisions for access help or hurt
transparency. Reporters in Eastern Europe, for example, found that newly adopted freedom of
information laws led to more difficulty in acquiring records because government officials created
new bureaucratic barriers and now had legal excuses at hand to deny access to information
(Camaj, 2016). Columbia University legal scholar David Pozen argues that public record laws
are reactionary and contribute to a culture of adversarialism, and in response, he recommends
systematic changes toward affirmative disclosure, rather than reactionary disclosure (Pozen,
2017). Others also have suggested that it might be time to try another approach, such as proactive
dissemination of information online without the need for public record requests (Stewart &
Davis, 2016). “FOIA is irrevocably broken,” they write. “Redrafting FOIA in the digital age, for
the digital age, is the only way to end the constant cycle of non-compliance, delay, frustration
and inadequate legislative revision that has plagued the law since its initial passage.” (p. 536)
Perhaps scrapping current laws, introducing new proactive systems, and implementing
new technological tools are worthy directions to pursue. But before ditching public record laws
altogether, or enacting legislative overhauls, it might behoove policy makers to identify what
elements of the laws produce better compliance and what statutory provisions don’t. Are specific
day deadlines more effective than requiring “promptness”? Do more severe penalties in the law
relate to better compliance? Do specific copy fee provisions result in lower monetary charges to
requesters? Does the perceived strength of laws, as written, even matter?
This study takes a step toward peeling back the layers of public record law effectiveness,
examining whether and how public record laws and their various permutations influence actual
agency performance in responding to public record requests. This research contributes to our
understanding of government transparency laws by tapping into the compliance results of
thousands of actual state public records requests, submitted over four years by citizens,
nonprofit organizations, and journalists throughout the United States. The results are compared to
the various permutations of state public record laws to get a sense for what works best and what
does not, and what might be changed, ultimately toward a more transparent and accountable
government.
Literature Review
Access to public records empowers citizens to find out what their government is up to, an
essential element of democratic theory (Blasi, 1977; Meiklejohn, 1961). While some states in the
United States and some nations have enshrined this right in their constitutions, for the most part,
access rights rely on statutes and case law. U.S. journalists lobbied Congress for passage of the
federal Freedom of Information Act in 1966 (Schudson, 2015), and since then all of the states
have enacted their own public record laws, as well as more than 120 nations (Global Right to
Information Rating, 2018).
A growing body of research suggests that freedom of information laws result in better
government and better societies. The release of government records appears to result in cleaner
drinking water (Bennear and Olmstead, 2008), greater confidence in the U.S. Social Security
system (Cook, Jacobs, and Kim, 2010), increased levels of transparency and accountability
(Worthy, 2010), and less corruption (Cucciniello, Porumbescu, & Grimmelikhuijsen, 2017;
Lyrio, Lunkes, & Taliani, 2018). Public records are cited in 91 percent of stories submitted to the
Investigative Reporters and Editors annual contest, and 70 percent of those stories result in
substantive change or action (Lanosga & Martin, 2017). Indeed, for every dollar spent on
records-based investigative reporting, society reaps about $300 in benefits (Hamilton, 2016).
Research, however, suggests that implementation of transparency laws can vary by
community and culture (Grimmelikhuijsen et al., 2013). One study in Florida found that the
more gender and ethnic diversity in a community the more transparent its government through
proactive posting of government data (Armstrong, 2008). On a global level, studies have found
that culture and other factors are more important than laws for journalists to acquire information
and operate freely (Bertoni, 2012; Lamble, 2004; Relly & Cuillier, 2010; Ricketson & Snell,
2002).
Indeed, public record laws can result in “perverse effects” through government agents
engaging in strategic behavior to appear transparent but actually increase secrecy (Williamson &
Eisen, 2016). Laws can amplify adversarial actions, increasing suspicion, antagonism and
resentment between government employees and journalists (Worthy, 2010). Methods of
bureaucratic resistance include changes in record keeping, manipulation of records, failure to
create records, centralizing information control, and privatizing government services (Roberts,
2006).
Some scholars have attempted to measure the performance of public record laws in the
United States, particularly at the federal level, usually comparing agencies and compliance over
time (Prime & Russomanno, 2018; Wagner, 2017). Legal scholars have analyzed FOIA
provisions, such as court or executive interpretations of the law (Kirtley, 2015; Pack, 2004). This
is important, but does not allow us to test the effectiveness of different legal provisions, since
these studies focus on one law, the U.S. FOIA.
Other scholars have attempted to measure the transparency level of U.S. states and
nations, ranking them from most transparent to least. This is typically done through four methods:
1. Legal Analysis
For years, Professor Bill Chamberlin led the systematic rating of different legal
provisions in state laws for the Brechner Center for Freedom of Information at the University of
Florida (Chamberlin et al., 2007; Citizen Access Project, 2008). In addition, legal scholars have
compared state statutes and case law for various provisions, including access to government
emails (Senat, 2014; Youm, 2014), penalties (Stewart, 2010; Marzen, 2017), copy fees (Lee,
2016), economic development records (Edmondson & Davis, 2011), and privatization of records
(Bunker & Davis, 1998). Non-profit government accountability groups have attempted to rate the
strength of legal provisions in state laws, as well (Better Government Association, 2007).
At the global level, Toby Mendel helped develop a rubric to rate various provisions of
national freedom of information laws - the gold standard today in public record law ratings. As
of 2019, the United States’ FOIA law ranked 69th out of 123 nations, behind Russia, Rwanda,
and Uganda, which highlights the limitations of equating strong laws with actual compliance.
“Anecdotal evidence, along with some high-profile instances of whistle-blowing, notably the
Edward Snowden disclosures, suggests that in practice the United States remains among the
more open countries globally.” (Mendel, 2016, p. 491) Just because a government has a strong
law might not mean it follows the law.
2. Proactive Posting
The second method of evaluating agency transparency has been through measuring
proactive dissemination of public records on government websites. Armstrong (2008) utilized
this method to compare agency transparency with community demographic factors. Non-profit
organizations, particularly those interested in government fiscal transparency, also have
measured the amount of public records provided proactively on government websites (Follow the
Money, 2018). The limitation of these studies is that they measure just one component of
transparency - proactive posting of information online - which might not be related to
compliance with public record laws. Just because an agency provides information online does
not mean it will respond positively to requesters. The U.S. government, for example, posts
millions of records online through agency websites and data portals, but agencies might be less
able or willing to provide records requested through FOIA.
3. Qualitative Observation
A third approach to assess the level of transparency of agencies is through qualitative
methods, such as in-person observation (Bush Kimball, 2003), interviews (Camaj, 2016), or
surveys of experts and journalists. The Center for Public Integrity (2015), for example, combined
a rating of state law provisions with a survey of experts in each state to develop a ranking of state
transparency. State experts, however, might know their own state agencies’ performances very
well, but could have difficulty comparing to other states.
4. Field Experiments
The fourth method is the measuring of agency transparency through field experiments
and audits. Since the 1990s, journalism and government transparency groups have conducted
access audits in dozens of states, typically sending people out to a variety of agencies to request
records, and then tracking and disseminating the results (National Freedom of Information
Coalition, 2019). These audits have been helpful in illuminating widespread compliance
problems, but because each audit covers just one state or community, carried out in different
ways, the results cannot be compared across states.
Some scholars have conducted field experiments to identify factors related to better
compliance. For example, one study in North Carolina found that peer pressure increases agency
compliance with public records law (ben-Aaron et al., 2017). Other field experiments have found
compliance to be related to strongly worded request letters (Cuillier, 2010), official letters
instead of informal asks (Worthy, John, & Vannoni, 2016), and higher social clout (Lagunes &
Pocasangre, 2017; Michener & Rodrigues, 2015). These experiments are useful for answering
specific questions, and they are tightly controlled to account for confounding variables, but they
usually do not cover enough jurisdictions that are regulated by different laws to compare public
record law provisions. The other limitation of experiments is that while strong in internal validity,
they lack external validity - the ability to mirror real life as it plays out agency to agency.
All of these studies demonstrate that the rich body of freedom of information research
continues to grow and develop. Up until this point, however, the field has lacked large-scale
datasets across jurisdictions to examine different legal structures. Strength of laws has been
measured, as well as proactive dissemination of records online by agencies. Experts have been
surveyed, and people have their gut hunches of what works best, and what agencies are best.
Everyone knows Florida has the best government transparency in the United States, right? Or
does it? And everyone else is quick to blame their own states as being terribly secretive and
backward. But beyond personal anecdotes and word of mouth, how do we really know? Do laws
really make a difference?
This study attempts to test common public record law provisions against actual agency
compliance. Because this study is exploratory, representing the first application of these data, it poses
six research questions.
The first question is whether agency compliance to public records laws is related to
existing measures of government transparency, such as rated strength of the law and the amount
of records provided online. This will help determine whether proactive transparency or strength
of law (or at least how it is currently measured) can predict actual compliance to the law.
Rl: Is compliance with public record laws related to overall perceived strength of the
law and proactive transparency?
The second question seeks to determine whether agency compliance with public record
laws is related to the age of the law. Mendel (2016), in his rating of national laws, notes that
nations that adopted FOIA laws earliest do not have laws as strong as those that have enacted
legislation more recently. Those passing laws today can learn from the mistakes of the past.
Three-quarters of the world’s FOIA laws have been adopted since the year 2000. Some state
public record laws were first enacted in the 1800s, although about half were created since the
1970s. Many underwent major revisions in the past 20 years. Does that mean compliance would
be better for those crafted more recently?
R2: Is compliance with public record laws related to the age of a law - from when it was
first enacted or overhauled?
Strength of penalties for noncompliance might encourage government officials to follow
public record laws. Previous analysis of penalty provisions in state public record laws show that
penalties vary widely, from nothing to jail time and heavy fines (Marzen, 2017; Stewart, 2010).
Previous research in just one state - Arizona - indicated that request letters threatening litigation
and punishment were more effective than friendly or neutral letters (Cuillier, 2010). That might
not translate across the states. Even if a law includes heavy penalties for noncompliance, it might not
be enforced, thereby reducing its effectiveness.
R3: Is compliance with public record laws related to the strength of penalties in state
public record laws?
Attorney fee-shifting provisions in public record laws are similar to penalties. If a
requester is denied information, he or she may file a lawsuit to compel the agency to provide the
records. Some state public record laws allow judges to award attorney fees to a requester who
prevails in court. These provisions vary widely in the states - some have no provision, some give
judges discretion to award attorney fees, and some require judges to award fees. Agencies may
be more likely to take requests more seriously if they could face paying out hundreds of
thousands of dollars in attorney fees, or face public embarrassment from such a payout.
R4: Is compliance with public record laws related to the strength of attorney fee-shifting
provisions in state public record laws?
Copy and search fees are common areas of complaints for requesters because they can
dissuade a requester from acquiring records if they must pay hundreds or thousands of dollars.
Some state laws allow just the actual materials of the copy to be charged, such as the cost of a
piece of paper and copy machine toner. Other states allow search time to be charged, or the time
for an attorney to review for redactions. Perhaps the strength of fee provisions could be related to
lower fees for requesters.
R5: Are copy fees charged to public record requesters related to fee provisions in state
public record laws?
The last research question addresses timeliness and deadline provisions. State public
record laws vary widely on their response requirements for agency officials. Some states have no
deadline, but instead require a “prompt” response. Some states set a deadline of up to three
business days, and others allow even more, up to the longest deadline of 30 days under Maryland state
law. The assumption by some is that a specific day deadline is best, rather than leaving it vague.
Others say that when agencies have day deadlines they wait until the last minute to respond,
which could mean a two-week delay in some states.
R6: Are response times to public record requests related to deadline provisions in state
public record laws?
Methodology
To answer the research questions, this study employs a database of thousands of public
records requests submitted by citizens, journalists, researchers, nonprofits and businesses to state
agencies from 2014 through 2017 through MuckRock (www.muckrock.com). This nonprofit
organization based in Boston, Massachusetts, is an online platform that has assisted requesters in
filing more than 63,000 public record request letters since 2010. Users pay $20 for assistance
with four requests, or can pay more for additional service ($40 per month for 20 requests or
organizational accounts for $100 per month). Staff members help draft request letters, identify
the agency contacts and email, fax, mail or upload to agency portals the letters on behalf of the
requesters. Agencies see the identity of the requester and of MuckRock. For states that require
residency to submit a public records request, MuckRock sends the request from an in-state
resident. Staff members follow up on the requests and note the outcomes in a database; those
outcomes comprise the three main dependent variables for this study: 1) whether the agency provided the
records (compliance), 2) how long the request took in days (timeliness), and 3) copy fees
charged by the agency.
The advantage of this dataset is that it covers the entire United States, including
Washington, D.C. Compliance outcomes can be calculated for each state and then those
outcomes may be compared to the various differences in legal provisions among all 51
jurisdictions. The other advantage of this dataset is that it has high external validity because it
represents actual public record requests from real people to a variety of public agencies. On the
other hand, that also limits the study’s internal validity - a fair amount of “noise” from different
types of records being requested will result in less precision in analysis. The benefit is the
request procedure is the same across all requests through the mediated MuckRock online portal
and staff.
MuckRock provided to the author a database of all requests it processed from its launch
in 2010 to July 2018, totaling 50,433 requests at all levels of government - federal agencies,
state agencies, cities and other local jurisdictions, throughout the United States. The requests in
2010 through 2013 were relatively few, during the early years of the startup’s endeavors, so they
were removed. Also, requests submitted in 2018 were removed because they would have less
time to be processed compared to earlier requests. Entries with incomplete data were removed,
and because this study is focusing on state-level laws, 16,000 federal requests were removed.
Also, it was decided a priori to remove local jurisdictions for this exploratory study because of
the wide differences in cities within states, which would create additional error, just as
Armstrong (2008) found differences in transparency among communities in Florida.
Massachusetts and Rhode Island were over-represented in the data, given the early focus of
MuckRock in the Northeast U.S., so some records from those states were randomly removed to
bring the request-per-capita number to the mean of the other states, at about 3 requests per
100,000 people. That left 7,125 requests submitted to state agencies throughout the country, from
24 in Rhode Island to 640 in New York, averaging 140 per state, all distributed relatively equally
on a per-capita basis.
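To make the filtering steps concrete, the following is a minimal pandas sketch of the data preparation described above. The study itself used Excel and SPSS; the file name, column names (year, level, status, state), and population figures here are illustrative assumptions, not MuckRock's actual schema.

```python
import pandas as pd

# Minimal sketch of the filtering described above; file and column names
# (year, level, status, state) are hypothetical, not MuckRock's schema.
df = pd.read_csv("muckrock_requests.csv")

df = df[df["year"].between(2014, 2017)]     # drop early startup years and 2018
df = df[df["level"] == "state"]             # drop federal and local requests
df = df.dropna(subset=["status", "state"])  # drop incomplete entries

# Downsample over-represented states (MA, RI) toward the per-capita mean
# of roughly 3 requests per 100,000 residents.
RATE = 3 / 100_000
pop = {"MA": 6_900_000, "RI": 1_060_000}    # illustrative population figures
for st, p in pop.items():
    subset = df[df["state"] == st]
    target = int(p * RATE)
    if len(subset) > target:
        keep = subset.sample(n=target, random_state=1)
        df = pd.concat([df[df["state"] != st], keep])
```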
The requests over-represent individual requesters looking for records pertaining to their
communities, for public-interest research, or for a cause, according to discussions with
MuckRock staff. About 40 percent of users of MuckRock are
journalists, which is higher than typical requester composition, usually ranging from 2 to 14
percent depending on the agency (Frequent Filers, 2006; Kwoka, 2016; Silver, 2016). Other users
of MuckRock include citizen activists and scholars. Commercial requesters are under-represented.
An examination of the request types, from simple to complex, and agency types
indicates a broad mix throughout the states, with the exception of Washington, D.C. Because the
District of Columbia is structured similarly to a city, a high percentage of requests (58 percent)
were directed to the police department, which is likely to result in a lower compliance rate
compared to states, which on average, received about 10-15 percent of requests to state law
enforcement agencies. Overall, the data were deemed suitable for exploratory study, to compare
across states and identify potential correlates.
Outcome Variables
Compliance. This variable represents the percentage of MuckRock requests that each
state agency completed during the four-year period, 2014-17. In all, 2,970 requests were noted
by MuckRock as completed, out of the 7,125 requests (42 percent). That did not include requests
that were partially completed (84), withdrawn (729), or in various stages of appeal. This study
acknowledges that some denials by agencies could be warranted - not all information should be
released just because it is requested. The measure gives a sense for the likelihood that a requester
will receive documents when submitting a public records request, in comparison to other states.
The compliance rate ranged from a low of 10 percent in Alabama to a high of 67 percent in
Idaho and Washington state. A complete list by state is provided in Table 1, and a map as Figure
1, in the appendix.
Timeliness. This measure represented the number of days from when a request was
submitted by MuckRock to when it was completed or closed. The average time nationally for
finishing a request was 59 days.
Copy fees. This measure represented the average price charged per state for copy fees or
search time. The average price nationally for copies, search time, or both, when charged, was
$67 (a few outliers in the millions were removed to avoid skewing the number). However, only
551 requests out of 7,125 (8 percent) encountered a copy fee, which reflects previous studies that
indicate agencies recoup relatively little of the cost of administering public records laws (Wagner,
2017).
An Excel file was created with each row representing a state or the District of Columbia,
for a total of 51 rows. The mean calculation for each of the three dependent variables was entered
into the database for each state, along with the predictor variables.
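Continuing the sketch above, the state-level aggregation might look like the following; the status labels and the days_to_close and fee_charged column names remain assumptions for illustration.

```python
# Aggregate the request-level data to one row per jurisdiction (51 in all).
states = df.groupby("state").agg(
    compliance=("status", lambda s: (s == "completed").mean() * 100),
    timeliness=("days_to_close", "mean"),
    copy_fee=("fee_charged", lambda f: f[f > 0].mean()),  # mean fee when charged
)
states.to_csv("state_outcomes.csv")
```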
Legal Predictor Variables
When possible, predictor variables were selected from the 2014-17 time range, within the
frame of the MuckRock requests. No doubt some state laws changed during that time frame or
after, injecting some additional error into the statistical analysis, but not enough to have a
significant effect on findings.
Law strength. This measure of public records law strength was compiled from the
Brechner Citizen Access Project ratings in 2008, just before the University of Florida project
ended. Academics, led by Bill Chamberlin, rated various aspects of state public record laws on a
1-7 scale, providing an overall score for each state. While these ratings were completed before
the records were requested by MuckRock, it is assumed that state laws do not change
dramatically and that the overall nature of the law in 2008 would be reflected in performance in
2014-17.
Online transparency. The U.S. Public Interest Research Group (2018) measured the
extent that each state posts spending data online proactively. States were graded A to F by
assessing whether citizens could view online, at state websites, spending for governor travel,
Department of Corrections electricity, Department of Agriculture motor fuel, and other
expenditures. The study also gathered other data for its assessment, including from surveys of
state officials.
Perceived transparency. This measure of state transparency was created by the Center
for Public Integrity (2015). Experts and journalists rated the effectiveness of agencies in their
respective states regarding use of exemptions, access to officials’ calendars, timeliness, appeal
procedures, completeness of response, appeal process, punishment for offenders, and willingness
for agencies to provide information in electronic format.
Penalties. This measure was created on a scale of 1 to 5, with “1” representing no
penalties (4 states), “2” a misdemeanor and/or less than $1,000 fine (12), “3” punishment set by
court (20), “4” greater than $1,000 fine (9), and “5” jail time (6). A second dichotomous variable
used for analysis was created for simplicity, of either “1” equaling no penalty or misdemeanor
(36 states) or “2” equaling jail time or a fine greater than $1,000 (15). This scale was created by
the author by reviewing statutes for all the states, as well as research by Marzen (2017), and the
Reporters Committee for Ereedom of the Press Open Government Guide (www.rcfp.org/ogg),
updated in January 2019.
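As an illustration, the dichotomous version amounts to collapsing categories 1 through 3 against 4 and 5; a one-line sketch, assuming the five-point measure is stored in a column named penalty_scale:

```python
# Collapse the 1-5 penalty scale into the dichotomous measure described above:
# 1 = no penalty, misdemeanor, or court-set punishment; 2 = fine > $1,000 or jail.
states["penalty_dichot"] = (states["penalty_scale"] >= 4).map({False: 1, True: 2})
```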
Attorney fee-shifting. This measure represents severity of fee-shifting provisions. A “1”
represents that a judge “may” impose fees if the plaintiff prevailed or substantially prevailed (30
states). A “2” represents a mandatory legal provision that requires judges to impose attorney fees
for the prevailing party (21). This scale was created by the author by analyzing each state’s
statute and case law. When questions arose, the author referred to the Reporters Committee guide
and expert media law attorneys in their respective states.
Copy fees. This was measured by a three-point scale as either “1” for no fees outlined in
law (14 states), “2” for reasonable fees (19), or “3” for fees specifically outlined (18), as per
reading by the author of statute and case law.
Deadlines. This was measured by a three-point scale as “1” for no time deadline (17
states), “2” for 1-5 days (18), or “3” for 6-30 days (16). This was gathered by the author from
statutes and case law.
Law age. This measured the age of each state’s public records law, based on the year of
the most recent major update.
Ombudsman. This variable identified whether a state had a formal ombudsman program
to mediate public record disputes and aid requesters. Ombudsman programs vary widely and are
believed to provide some benefits for requesters (Stewart, 2009). A “1” indicated existence of an
ombudsman program (39) and a “2” indicated no formal program (12).
Political Predictor Variables
Liberal ideology. This variable was operationalized as the percent of state residents who
said they are liberal (Pew, 2017). Previous research has been mixed on whether liberal ideology
is predictive of support for government transparency, particularly when controlling for education.
Libertarianism. This variable was represented by the percentage of voters in each state
that voted for libertarian candidate Gary Johnson in the 2016 presidential election.
Political engagement. Some research indicates a connection between support for open
records and civic engagement and political involvement (Cuillier, 2008). This was
operationalized by a ranking of the states by political engagement, including the percentage of
registered voters, voter turnout, political contributions, participation in civic groups, and a half
dozen other measures (WalletHub, 2018).
Social capital. Similar to political engagement, social capital measures engagement in
community, and could therefore be predictive of more robust government information sharing.
This measure applied Putnam’s (1998) social capital rating for each state.
Perceptions of corruption. Government transparency scholars have found mixed results
in connecting greater agency openness with lower corruption. This study will use a measure of
perceived corruption, based on surveys of news reporters that cover state politics (Institute for
Corruption Studies, 2018).
Traditionalistic political culture. This was operationalized as “1” for states with a
traditionalistic political culture, as identified by Elazar (1966), and a “2” for states identified as
moralistic or individualistic. Elazar defined traditionalistic cultures as more likely to view
government as necessary to maintain the status quo, and those states that view political
participation as a privilege reserved for those who meet the qualifications. Perhaps state agencies
with such political cultures would be less forthcoming to provide records, particularly to citizens
who would question authority.
Demographic Predictor Variables
Population. State population data collected from the U.S. Census FactFinder, July 1,
2018.
Education. This was operationalized by the percentage of the population in each
state with a bachelor’s degree or higher, according to the Census FactFinder in 2018.
Income. The average per-capita income in each state, according to the 2018 Census
FactFinder.
Non-white. The percentage of citizens in each state that did not self-identify as white,
according to the 2018 Census FactFinder.
Religiosity. This measure is based on an index of four questions asked of 35,000 U.S.
adults regarding their religious beliefs (Pew, 2016).
Internet connectivity. Previous research indicates that those who use the internet for
information-seeking tend to be more supportive of government transparency (Cuillier &
Piotrowski, 2009). This study will test whether compliance is better in states with internet
connectivity, as measured by the percentage of households with internet subscriptions (National
Center for Education Statistics, 2016).
“Best” states. U.S. News & World Report (2019) combined 70 metrics to create a
ranking of the “Best” states, using measures of health care, education, economy, infrastructure,
fiscal stability, crime, opportunity, and environment. States in the best shape might have more
bandwidth and resources to support public records dissemination.
Structural pluralism. Armstrong (2008) noted that communities with varied, competing
institutions tended to have governments more likely to proactively post information on their
websites. It is possible records dissemination also could be related to structural pluralism. This
was measured by averaging the z scores for education, income, non-agricultural occupations, the
percentage of professional workers, non-white residents, population, and those not married
(Cronbach’s alpha = .72).
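A sketch of how such an index can be built and checked follows, under the assumption that the (hypothetically named) indicator columns have been merged into the state-level table from the earlier sketches.

```python
import pandas as pd

def zscore(col: pd.Series) -> pd.Series:
    # Standardize an indicator to mean 0, standard deviation 1.
    return (col - col.mean()) / col.std()

# Hypothetical indicator names; the index is the mean of their z scores.
items = ["education", "income", "nonag_occupations", "pct_professional",
         "pct_nonwhite", "population", "pct_unmarried"]
z = states[items].apply(zscore)
states["pluralism"] = z.mean(axis=1)

def cronbach_alpha(scores: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = scores.shape[1]
    return k / (k - 1) * (1 - scores.var(ddof=1).sum()
                          / scores.sum(axis=1).var(ddof=1))

print(cronbach_alpha(z))  # reported above as .72
```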
Southern region. States were coded by their U.S. Census designated division (nine
altogether) to examine regional variation. Preliminary analysis indicated two regions standing
out, so a “Southern” variable was created by “1” representing states in the East South Central
division (Alabama, Kentucky, Mississippi, and Tennessee), and the West South Central division
(Arkansas, Louisiana, Oklahoma, and Texas). All other states were coded as “2”.
The variables for each state were entered into a database and analyzed in SPSS. Each
record represented a state, including the District of Columbia, for a total of 51 records containing
the variables of interest. Given the relatively small sample size, it is assumed results will be less
likely to achieve statistical significance in correlational and means analyses, and certainly under
multiple regression analysis. Any statistically significant findings will be considered strong and
findings close to significance will be reported.
Results
Before addressing the six research questions, analyses were conducted to assess the
nature of these three new measures, and to test their external validity. This was done through
evaluation of variables associated with government transparency in previous research, and
evaluation by experts. A ranked list of states with their compliance percentages (see Table 1 in
the appendix), along with a heat map of the nation (Figure 1, in the appendix), were provided to
experts for their review.
First, it was noted that the overall compliance rate among state agencies for the entire
nation dropped steadily from 2014 through 2017, from 51 percent to 37 percent (Table 2 in the
appendix). No discernible trend could be found in timeliness and copy fees. The three measures
of agency responsiveness - compliance, timeliness, and fees - were not closely related to each
other. A potential relationship was identified between compliance and timeliness (r = .24, p
= .09), but not with copy fees charged (r = .06, p = .70). Copy fees, or the lack of fees, do not
appear to be a reliable predictor of transparency, but timeliness may.
When looking at demographic correlates (see Table 3), some findings support previous
research and the validity of the compliance measure. For example, states with a higher
percentage of households that use the internet are more likely to be compliant with public record
laws (r = .46, p < .01). Liberal ideology also was found correlated with better compliance (r
= .38, p < .01), and religiosity negatively correlated (r = -.50, p < .001). Social capital also was
related to compliance (r = .35, p < .05).
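The study ran its analyses in SPSS; an equivalent check in Python, assuming the predictor columns have been merged into the state-level table under the hypothetical names below, might be:

```python
from scipy.stats import pearsonr

# Pearson correlations between compliance and selected predictors;
# the column names mirror the variables described above and are assumed.
for var in ["internet", "liberal", "religiosity", "social_capital"]:
    r, p = pearsonr(states["compliance"], states[var])
    print(f"{var}: r = {r:.2f}, p = {p:.3f}")
```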
Population had no relationship to compliance - states big and small could be compliant or
non-compliant. Some demographic variables, such as education, income and structural pluralism,
did not reach statistical significance, but were close, and given the small sample (50 states and
the District of Columbia), should not be ignored.
One of the most striking demographic variables was Southern region (r = -.55, p < .001).
The central southern states (Alabama, Arkansas, Kentucky, Louisiana, Mississippi, Oklahoma,
Tennessee, and Texas) averaged 28% compliance, compared to a 44% average for the other
states, and nearly all ranked in the bottom half of the nation with the exception of Texas, which
ranked 24th. This relationship held (p = .07), even when controlling for other factors in multiple
regression analysis, including income, racial diversity, political ideology, and education. Indeed,
in regression analysis, no factor remained related to compliance except the southern region
variable (Table 4).
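A hedged sketch of such a regression, using statsmodels with assumed column names and the Southern indicator recoded as a 0/1 dummy:

```python
import statsmodels.api as sm

# OLS of compliance on the Southern dummy plus controls; with n = 51,
# expect wide confidence intervals, as the text cautions.
X = sm.add_constant(states[["southern", "income", "pct_nonwhite",
                            "liberal", "education"]])
model = sm.OLS(states["compliance"], X).fit()
print(model.summary())
```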
Turning to the research questions, the first query asked whether compliance with public
record laws would be related to other measures of transparency, such as the perceived strength of
the law or proactive transparency online. Correlational analysis found no relationships, except a
possible connection between compliance and perceived transparency, as measured by surveys of
access experts in the states (r = .20, p = .16), but the correlation is weak, at best.
The second research question asked whether agency compliance with public record laws
is related to the age of the law, from when it was first enacted or overhauled. Again, no strong
correlation was found between the law’s age and compliance. If anything, the data indicate that
states with older laws and less recent updates have stronger compliance (r = .19, p = .18).
The third research question asked whether penalty provisions would be related to better
compliance, particularly states that allow for jail time and steep fines. No relationship was found
(r = -.04, p = .80). It appears punishment outlined in the law has little effect on compliance.
The fourth research question asked whether the strength of attorney fee-shifting
provisions in public record laws would be related to better compliance. A relatively strong
positive and statistically significant correlation was found (r = .30, p < .05). States with no fee-
shifting provisions or discretionary provisions averaged 39% compliance, compared to states
with mandatory fee-shifting provisions, which averaged 45%.
The fifth research question asked whether copy fees charged to requesters would be
related to fee provisions in state public record laws. Analysis found no relationship (r = .08, p
= .57).
The last research question asked whether timely response to public records requests
would be related to deadline provisions in the law. No correlation was found with the scale, but
further analysis did yield an interesting finding. When calculating means for each of the
categories, 1) No deadline, 2) 1-5 days, or 3) 6-30 days, it is apparent that the most effective
legal provision is a requirement to respond within one to five days. That requirement resulted in
a 51-day average response time, compared to 60 days for having no specific deadline and 63
days for having a deadline 6 to 30 days. While analysis of variance did not reveal statistically
significant results, F(1, 49) = 1.64, p = .21, the means differences suggest further study,
particularly given the low number of records under analysis. No discernible relation was found
between compliance and states with ombudsman offices.
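The deadline comparison amounts to a one-way ANOVA across the three categories; a minimal sketch, again with assumed column names:

```python
from scipy.stats import f_oneway

# Compare mean response times across the three deadline categories
# (1 = none, 2 = 1-5 days, 3 = 6-30 days); column names are assumed.
groups = [g["timeliness"].dropna() for _, g in states.groupby("deadline_cat")]
f_stat, p_val = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_val:.2f}")
```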
Summary results were provided to a set of experts from throughout the United States to
review, including 11 leaders of state freedom of information coalitions, 6 freedom of information
scholars, and the co-founder of MuckRock. They were asked if the measure seemed to match
reality, based on their experiences, as well as their impressions of the correlations. For the
most part, the experts said the ratings made sense to them, with some caveats. Some thought the
ratings were too high for their states, some too low. Some respondents questioned the validity of
the measure because of the varied records requested throughout the states. While the measure’s
strength is in its external validity, some expressed hope of someday gathering data from a large
nationwide controlled field experiment. Many were surprised that Florida did not
rank higher than 31st, and some were surprised by the “Southern effect.” A few thought it might
be caused by the fact many southern states have a residency requirement for requesters, even if
MuckRock submits requests from in-state citizens. Indeed, a post-hoc analysis found a negative
correlation between compliance and the six states with residency requirements (r = -.35, p < .05),
including Alabama, Arkansas, and Kentucky. However, in regression modeling the Southern
effect stays significant and the residency requirement correlation dissipates.
MuckRock co-founder Michael Morisy and others said they were pleased to see
compliance correlate to less corruption. “It’s a claim we’ve been saying for a long time, but it
was hard to always tell if it was accurate,” Morisy said. “The correlations were not surprising,
but they were comforting.” Many thought penalty provisions should have been more strongly
correlated with the compliance measure, but inferred that because penalties are rarely enforced,
they are not taken seriously by agencies.
Discussion
This exploratory analysis compares the public records compliance of the 50 states and
Washington, D.C., in relation to demographics, political variables, and specific legal provisions,
providing several take-away points and opportunities for further investigation. At minimum, the
results suggest that laws, or at least one provision of the laws, could indeed influence actual
performance by agencies in complying with public record requests.
In general, the findings indicate a promising new method for measuring government
transparency. The compliance measure was found to be correlated with variables typically
associated with open government, such as perceived lower corruption, internet connectivity,
liberal ideology, social capital, and well-developed infrastructure and community affluence (the
“Best” states rating).
Interestingly, little correlation could be found between compliance and previous
measures of government transparency, such as indices measuring the strength of laws or
proactive dissemination of records online. It is likely that some legal provisions make more of a
difference on compliance than others, and attempting to create ratings averaging large numbers
of indicators results in little reflection of reality for the average records requester. Also, rating
agencies on their proactive posting of records online might be helpful, but does not necessarily
mean the agency will be forthcoming with records when asked. Ratings of states based on expert
opinions have potential, as the findings indicate some support, although not statistically
significant.
Further, the data indicate that few elements of public record laws seem to have a direct
connection to how well they work for requesters. Penalty provisions seem to have little relation
to compliance, perhaps because they are so rarely enforced (Stewart, 2010). If recalcitrant
officials are never prosecuted or fined, then the law may not be taken seriously by agencies. Also,
copy fee provisions do not appear to be related to actual fees charged, perhaps because fees are
levied arbitrarily and rarely (Lee, 2016). States with ombudsman offices appear to have little
better compliance than states without. Further, updated laws do not seem to affect compliance.
If anything, the results suggested states with older laws enjoyed better compliance. Perhaps those
updates were the result of pushback against a culture of secrecy that persists despite any
amendments.
The findings, however, indicate that at least one legal provision could be essential for
compliance. In particular, analysis revealed a significant correlation between compliance and
mandatory attorney fee-shifting provisions. States that allow judges broad discretion, or impose
high burdens of success for litigating requesters, demonstrate worse compliance than states that
mandate judges to impose attorney fees. Certainly, agencies might not worry about a $1,000 fine
or other slap on the wrist, but it appears they pay attention to paying tens if not hundreds of
thousands of dollars to a successful plaintiff’s attorney, not to mention the bad publicity that
would create for the agency. If there is one area for state freedom of information coalitions,
journalism organizations and access advocates to focus their lobbying energy, it might be in
enacting mandatory fee-shifting provisions in every state law. Similarly, the data indicate that the
best deadline provisions might be 1-5 days. Anything longer and delays lengthen, and
requirements of being “prompt” appear to result in similar long delays.
Finally, an unexpected finding indicates that southern states, not including Florida and
others bordering the Atlantic, demonstrate some of the lowest compliance in the country, even
when controlling for key demographic variables. This might suggest that compliance could be
closely tied to political culture, regardless of the law. Perhaps demanding public records could be
deemed impolite in some communities. As noted earlier, culture can be influential when it comes
to freedom of information compliance (Bertoni, 2012; Grimmelikhuijsen et al., 2013; Lamble,
2004; Relly & Cuillier, 2010; Ricketson & Snell, 2002). More research is needed to delve into
what could account for the significantly lower compliance in that region, and how to compensate
for it.
Ultimately, much of the U.S. advocacy community’s resources, time, and energy is
focused on improving public record statutes, proactive dissemination, and litigation. While those
are critical, this study suggests there is another area that is often missed: political culture. We
learned from this study that political culture seems to matter, particularly if it is traditionalistic,
as Elazar (1966) defines it. The question is how to change or adapt to that culture, at least in a
public records setting, to facilitate citizen access to government. Perhaps mandatory training for
public employees could help, or the building of strong freedom of information coalitions in states
to foster a culture of openness, or a large-scale “Got Info?” public relations campaign.
Limitations
Comparing states is not a simple task, since every state cannot necessarily be labeled as
“transparent” or “secretive.” Unfortunately, the number of states is not large enough for powerful
statistical analysis, but perhaps just large enough for basic exploratory studies such as this. The
correlations were certainly higher than expected, despite the “noise” caused by lack of a
controlled experiment.
The data are relatively new, and no doubt will be refined by MuckRock as the
organization continues to expand and handle more and more requests. As noted earlier, these
requests are over-represented by journalists and citizen activists, which no doubt affected the
ratings. On the one hand, compliance could be higher because they are savvy requesters, and
previous research indicates that journalists are treated better than average citizens (e.g.,
Darbishire & Carson, 2006). However, on the other hand, journalists and activists often pursue
records that agencies don’t want out, particularly those involving law enforcement, and that
could decrease compliance. Despite that limitation, given the large number of requests over four
years and that they were relatively consistent across states, the results were suitable for
exploratory purposes.
Future Research
Future studies should dig deeper into this rich dataset accumulated by MuckRock, which
grows by 1,500 requests every month. For example, this analysis could be conducted at the city
or county levels with a much larger dataset to investigate how state laws might influence
compliance by municipalities. As years of data are collected, some studies can examine the
effects of amendments on compliance through longitudinal studies. This method could be
replicated and refined elsewhere in the world, particularly with data tracked by Alaveteli, which
has launched similar requester services in 25 countries, facilitating 315,000 requests
(https://alaveteli.org/).
Further research also should focus on specific legal provisions and their effect on
compliance, including experiments, surveys, and interviews. Not every part of a public records
law is created equal, and identifying those elements that have the most impact on the requester,
and therefore society, will help those who wish to improve them.
Conclusion
This study attempted to apply a new measure for freedom of information compliance,
based on thousands of actual public record requests processed throughout the United States.
While not a perfect measure, this MuckRock dataset indicates that public record laws, or at least
parts of them, might have value after all. Despite the valid concerns raised by journalists and
others about widespread agency noncompliance, certain legal elements in this reactionary system
appear correlated with better service for requesters, particularly a specific short deadline (five
days or less) and mandatory attorney fee-shifting provisions. This does not mean that other new
ways of government information dissemination should be ignored, such as through proactive
automatic online posting of government data and new incentives for disclosure. But it does
suggest that nations and states wishing to improve transparency and accountability could do so
by integrating tougher legal provisions and more open political cultures that make a difference.
References
Armstrong, C. (2008). Exploring a two-dimensional model of community pluralism and its
effects on the level of transparency in community decision making. Journalism and Mass
Communication Quarterly, 85(4), 807-822.
ben-Aaron, J., Denny, M., Desmarais, B., & Wallach, H. (2017). Transparency by conformity: A
field experiment evaluating openness in local governments. Public Administration
Review, 77(1), 68-77.
Bennear, L., & Olmstead, S. (2008). The impacts of the “Right to Know”: Information disclosure
and the violation of drinking water standards. Journal of Environmental Economics and
Management, 56, 117-130.
Bertoni, E. (2012). Freedom of information: Three harmless words? The role of the media and
access to information laws. Derecho Comparado de la Informacion, 19, 29-94.
Better Government Association (2007). States Failing FOI Responsiveness, a report by Charles
N. Davis and the National Freedom of Information Coalition. Available at
https://www.nfoic.org/states-failing-foi-responsiveness (accessed April 1, 2019).
Blasi, V. (1977). The checking value in First Amendment theory. American Bar Foundation
Research Journal, 2(3), 521-649.
Bluemink, E., & Brush, M. (2005). A flawed tool: Environmental reporters’ experience with the
Freedom of Information Act. Society of Environmental Journalists report, available at
http://www.sej.org/foia/SEJ_FOIA_Report2005.pdf (accessed April 1, 2019).
Bridis, T. (2018). U.S. sets new record for censoring, withholding gov’t files. The Associated
Press, March 12, 2018. Available at
https://apnews.com/714791d91d7944e49a284a51fab65b85 (accessed April 1, 2019).
Bunker, M., & Davis, C. N. (1998). Privatized government functions and freedom of information:
Public accountability in an age of private governance. Journalism and Mass
Communication Quarterly, 75(3), 464-477.
Bush Kimball, M. (2003). Law enforcement records custodians’ decision-making behaviors in
response to Florida’s public records law. Communication Law and Policy, 8, 313-360.
Camaj, L. (2016). Governments’ uses and misuses of information laws in emerging European
democracies: FOI laws’ impact on news agenda-building in Albania, Kosovo, and
Montenegro. Journalism and Mass Communication Quarterly, 93(4), 923-945.
Center for Public Integrity (2015). How does your state rank for integrity? Available at
https://publicintegrity.org/accountability/how-does-your-state-rank-for-integrity/
(accessed April 1, 2019).
Chamberlin, B., Popescu, C., Weigold, M.P., & Laughner, N. (2007). Searching for patterns in
the laws governing access to records and meetings in the fifty states by using multiple
research tools. University of Florida Journal of Law and Public Policy, 18, 415-445.
Citizen Access Project (2008). Do you live in the sunshine or the shade? University of Florida
Brechner Center for Freedom of Information. Available through the Wayback Machine at
https://web.archive.Org/web/20090402105406/http://www.citizenaccess.org/ (accessed
April 1, 2019).
Cook, F., Jacobs, L., & Kim, D. (2010). Trusting what you know: information, knowledge and
confidence in Social Security. Journal of Politics, 72(2), 397-412.
Cucciniello, M., Porumbescu, G., & Grimmelikhuijsen, S. (2017). 25 years of transparency
research: evidence and future directions. Public Administration Review, 77(1), 32-44.
Cuillier, D. (2011, May 18-20). U.S. journalists’ use of public records during economic crisis.
Paper presented at the Global Conference on Transparency Research, Newark, NJ.
Cuillier, D. (2010). Honey v. vinegar: Testing compliance-gaining theories in the context of
freedom of information laws. Communication Law and Policy, 15(3), 203-229.
Cuillier, D. (2008). Access attitudes: A social learning approach to examining community
engagement and support for press access to government records. Journalism & Mass
Communication Quarterly, 85(3), 551-578.
Darbishire, H., & Carson, T. (2006). Transparency and silence: A survey of access to
information laws and practices in 14 countries. Open Society Justice Initiative. Available
at https://www.issuelab.org/resources/7736/7736.pdf
Edmondson, A., & Davis, C. N. (2011). Prisoners of private industry: Economic development
and state sunshine laws. Communication Law and Policy, 16, 317-348.
Elazar, D. J. (1966). American federalism: A view from the states. New York: Thomas Y.
Crowell, 228.
Follow the Money (2018). U.S. Public Interest Research Group report available at
https://uspirgedfund.org/reports/usf/following-money-2018 (accessed April 1, 2019).
Global Right to Information Rating (2018). Access Info Europe and Centre for Law and
Democracy. Available at http://www.rti-rating.org/country-data/ (accessed April 1, 2019).
Grimmelikhuijsen, S., Porumbescu, G., Hong, B., & Im, I. (2013). The effect of transparency on
trust in government: A cross-national comparative experiment. Public Administration
Review, 73(4), 575-586.
Mehta, C. (2018). FOIA lawsuits reach record highs in FY 2018. Transactional Records Access
Clearinghouse, Nov. 12, 2018. http://foiaproject.org/2018/11/12/annual-report-foia-
lawsuits-reach-record-highs-in-fy-2018/ (accessed April 1, 2019).
Hamilton, J. (2016). Democracy’s Detectives: The Economics of Investigative Journalism.
Cambridge: Harvard University Press.
Holsen, S., MacDonald, C., & Glover, M. (2007). Journalists’ use of the UK Freedom of
Information Act. Open Government, 3, 1-16.
Institute for Corruption Studies (2018). Measuring illegal and legal corruption in American states:
Some results from 2018 Corruption in America Survey. Available at
http://greasethewheels.org/cpi/
Kirtley, J. (2015). ‘Misguided in principle and unworkable in practice’: It is time to discard the
Reporters Committee doctrine of practical obscurity (and its evil twin, the right to be
forgotten). Communication Law and Policy, 20(2), 91-115.
Lamble, S. (2004). Media use of freedom of information surveyed: New Zealand puts Australia
and Canada to shame. Freedom of Information Review, 109, 5-8. Available at
http://www.ricksnell.com.au/FOI%20Reviews/FOI109.pdf (accessed April 1, 2019).
LaFleur, J. (2011). FOIA eyes only: How buried statutes are keeping information secret.
ProPublica (March 14, 2011). Available at https://www.propublica.org/article/foia-
exemptions-sunshine-law (accessed April 1, 2019).
Lagunes, P., & Pocasangre, O. (2017). Dynamic transparency: An audit of Mexico’s Freedom of
Information Act. Inter-American Development Bank working paper, IDB-WP-836.
Lanosga, G., & Martin, J. (2017). Journalists, sources, and policy outcomes: Insights from three-
plus decades of investigative reporting contest entries. Journalism, 1-18.
Lee, T. (2016). Public records fees hidden in the law: A study of conflicting judicial approaches
to the determination of the scope of imposable public records fees. Communication Law
and Policy, 21(2), 251-279.
Leopold, J. (2015). Ensuring agency compliance with the Freedom of Information Act (FOIA).
House Committee on Oversight and Government Reform, June 2, 2015, available at
https://oversight.house.gov/legislation/hearings/full-committee-hearing-ensuring-agency-
compliance-with-the-freedom-of (accessed April 1, 2019).
Lyrio, M., Lunkes, R., & Taliani, E. (2018). Thirty years of studies on transparency,
accountability, and corruption in the public sector: The state of the art and opportunities
for future research. Public Integrity, 0, 1-22.
Marzen, C. (2017). Public record denials. New York University Journal of Law and Liberty, 11,
966-1027.
Mehta, C. (2018). FOIA lawsuits reach record highs in FY 2018. Transactional Records Access
Clearinghouse, Nov. 12, 2018. Available at http://foiaproject.org/2018/11/12/annual-report-
foia-lawsuits-reach-record-highs-in-fy-2018/ (accessed April 1, 2019).
Meiklejohn, A. (1961). The First Amendment is an absolute. In P. Kurland (Ed.), The Supreme
Court Review. Chicago: University of Chicago Press.
Mendel, T. (2016). The fiftieth anniversary of the Freedom of Information Act: How it measures
up against international standards and other laws. Communication Law and Policy, 21(4),
465-491.
Michener, G., & Rodrigues, K. (2015). Who wants to know? Assessing discrimination in
transparency and freedom of information regimes. Presented at the 4th Global Conference
on Transparency Studies, Switzerland, June 4-6, 2015.
National Center for Education Statistics (2016). Number and percentage of households with
computer and internet access, by state. Available at
https://nces.ed.gov/programs/digest/d17/tables/dt17_702.60.asp
National Freedom of Information Coalition (2019). FOI audits. Available at
https://www.nfoic.org/foi-audits (accessed April 1, 2019).
News Media for Open Government (2018). SGI FOIA Storybase. Available at
http://foropengov.org/stories/ (accessed April 1, 2019).
Pack, B. (2004). FOIA frustration: Access to government records under the Bush Administration.
Arizona Law Review, 46, 815-842.
Pew Research Center (February 29, 2016). How religious is your state? Available at
https://www.pewresearch.org/fact-tank/2016/02/29/how-religious-is-your-
state/?state=alabama
Pozen, D. E. (2017). Freedom of information beyond the Freedom of Information Act. University
of Pennsylvania Law Review, 165, 1097-1158.
Prime, T., & Russomanno, J. (2018). The future of FOIA: Course corrections for the digital age.
Communication Law and Policy, 23(3), 267-300.
ProPublica (2016). Delayed, denied, dismissed: Failures on the FOIA front. Available at
https://www.propublica.org/article/delayed-denied-dismissed-failures-on-the-foia-front
(accessed April 1, 2019).
Putnam, R. D. (1998). Bowling alone: The collapse and revival of American community.
Available at http://bowlingalone.com/?page_id=7
Relly, J. E., & Cuillier, D. (2010). A comparison of political, cultural, and economic indicators
of access to information in Arab and non-Arab states. Government Information Quarterly,
27, 360-370.
Ricketson, M., & Snell, R. (2002). FOI: Threatened by governments, underused by journalists—
still a sharp tool. In S. Tanner (Ed.), Journalism: Investigation and research (1st ed., pp.
150-164). Sydney, Australia: Longman.
Roberts, A. (2006). Dashed expectations: Governmental adaptation to transparency rules. In C.
Hood & D. Heald (Eds.), Transparency: The key to better governance? Oxford: Oxford
University Press.
Schudson, M. (2015). The rise of the right to know: Politics and the culture of transparency,
1945-1975. Cambridge, Mass.: Harvard University Press.
Senat, J. (2014). Whose business is it: Is public business conducted on officials’ personal
electronic devices subject to state open records laws? Communication Law and Policy, 19,
293-326.
Stewart, D. (2010). Let the sunshine in, or else: An examination of the “teeth” of state and
federal open meetings and open records laws. Communication Law and Policy, 15, 265-
310.
Stewart, D. R. (2009). Managing conflict over access: A typology of sunshine law dispute
resolution systems. Journal of Media Law & Ethics, 1(1/2), 49-82.
Stewart, D. R., & Davis, C. N. (2016). Bringing back full disclosure: A call for dismantling
FOIA. Communication Law and Policy, 21(4), 515-537.
U.S. News & World Report (2019). Best states rankings. Available at
https://www.usnews.com/news/best-states/rankings
Wagner, A. J. (2017). Essential or extravagant: Considering FOIA budgets, costs and fees.
Government Information Quarterly, 34(3), 355-366.
WalletHub (2018). Most and least politically engaged states. Available at
https://wallethub.com/edu/most-least-politically-engaged-states/7782/#methodology
Williamson, V., & Eisen, N. (2016). The impact of open government: Assessing the evidence.
Center for Effective Public Management at Brookings Institution.
Table 1
State Compliance Rates (in order from best compliance to least)
Rank  State            N    Compliance (%)  Avg Days  Avg Fee
1     Idaho            75   67              15        $18
2     Washington       180  67              82        $2
3     Nebraska         75   59              27        $38
4     Rhode Island     24   54              8         $125
5     Iowa             93   54              48        $63
6     New York         640  52              79        $2
7     Indiana          116  52              49        $0
8     North Carolina   122  52              71        $6
9     Maryland         84   50              65        $184
10    Arizona          210  49              85        $3
11    Illinois         154  48              46        $0
12    Nevada           99   47              88        $412
13    North Dakota     127  47              32        $227
14    Vermont          127  47              33        $10
15    Wyoming          71   46              47        $12
16    Pennsylvania     168  46              44        $5
17    Colorado         167  46              62        $82
18    Connecticut      176  45              72        $2
19    Ohio             184  45              52        $0
20    Delaware         103  45              64        $11
21    Wisconsin        102  44              49        $8
22    Missouri         148  43              37        $58
23    Montana          109  43              39        $40
24    New Hampshire    90   42              36        $22
25    Texas            326  41              43        $222
26    New Mexico       105  41              79        $15
27    Massachusetts    120  41              40        $303
28    Michigan         136  40              34        $283
29    South Carolina   104  40              59        $56
30    Minnesota        94   39              77        $5
31    Florida          269  39              65        $238
32    Kentucky         89   38              45        $1
33    Alaska           76   38              74        $161
34    Maine            74   38              39        $70
35    Hawaii           74   36              62        $188
36    Utah             96   36              84        $16
37    California       413  36              60        $35
38    Oregon           91   36              34        $31
39    D.C.             127  36              142       $4
40    Georgia          175  36              54        $188
41    West Virginia    87   36              39        $3
42    South Dakota     82   33              31        $56
43    Louisiana        113  33              92        $199
44    Oklahoma         129  31              103       $2
45    Tennessee        112  30              42        $3
46    Kansas           132  30              50        $32
47    Virginia         175  30              22        $25
48    New Jersey       172  25              66        $0
49    Mississippi      83   22              157       $12
50    Arkansas         98   16              24        $0
51    Alabama          129  10              76        $6
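Note: The per-state figures in Table 1 (and the yearly figures in Table 2) can be reproduced from the request-level MuckRock data with a short aggregation script. The following is a minimal sketch in Python, assuming a hypothetical requests.csv export with columns state, status, days_to_response, and fee; the actual export format, and the study's precise operationalization of compliance, may differ.

import pandas as pd

# Load the request-level data (hypothetical file and column names).
df = pd.read_csv("requests.csv")

# Assumed operationalization: a request counts as compliant if records
# were released in full or in part.
df["compliant"] = df["status"].isin(["completed", "partial"])

# Aggregate to the state level: N, compliance rate, average days, average fee.
table1 = (
    df.groupby("state")
      .agg(N=("compliant", "size"),
           compliance_pct=("compliant", lambda s: 100 * s.mean()),
           avg_days=("days_to_response", "mean"),
           avg_fee=("fee", "mean"))
      .sort_values("compliance_pct", ascending=False)
)
print(table1.round(0))

Grouping by request year instead of state yields the national trends reported in Table 2.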
Figure 1
United States Map of Public Records Compliance
[Choropleth map of the United States; legend scale for compliance from 22.0 to 59.0]
Range from 10% to 67%, with dark red indicating lowest compliance (primarily states with
traditionalistic political cultures), and yellow highest compliance (states with moralistic and
individualistic political cultures).
Table 2
National Trends in Compliance, Timeliness, and Copy Fees, 2014-2017
                2014    2015    2016    2017
Requests (N)    855     1,590   1,758   2,922
Compliance      50.9%   43.8%   42.7%   37.3%
Avg Days        64      76      60      47
Avg Fee         $20     $100    $69     $61
Table 3
Correlations for Compliance by Predictor Variables (N = 51)

Predictor variables         Mean   SD     Correlation  Sig.
Demographic variables
  Population                                .00        .98
  Education                                 .21        .14
  Income                                    .23        .10
  Non-white                                -.23        .10
  Religiosity                              -.50***     .000
  Internet connectivity                     .46**      .001
  "Best" states                                        .005
  Structural pluralism (2)                  .24        .10
  Southern region                          -.55***     .000
    South                   28%    10.8
    Other                   44%    8.8
Political variables
  Liberal ideology                          .38**      .007
  Libertarianism                            .25        .08
  Political engagement                      .20        .15
  Social capital                            .35*       .02
  Corruption                               -.34*       .02
  Traditionalistic culture                 -.46**      .001
Legal variables
  Law strength                              .09        .55
  Online transparency                       .13        .36
  Perceived transparency                    .20        .16
  Law age                                   .19        .18
  Copy fees charged                         .08        .57
  Deadlines                                 .05        .73
  Penalties                                -.04        .80
    None/misdemeanor        41%    11.4
    Fine or jail time       41%    9.4
  Attorney fee-shifting                     .30*       .03
    Discretionary           39%    10.3
    Mandatory               45%    10.4
  Ombudsman                                 .14        .33
    Yes                     44%    8.0
    No                      40%    11.4

* = p < .05; ** = p < .01; *** = p < .001
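Note: Each coefficient in Table 3 is a zero-order Pearson correlation between state-level compliance and a predictor across the 51 jurisdictions. A minimal sketch of that computation, assuming a hypothetical states.csv file with one row per jurisdiction and illustrative column names:

import pandas as pd
from scipy.stats import pearsonr

states = pd.read_csv("states.csv")  # hypothetical state-level file

# Illustrative predictor column names; the study's actual variables and
# sources are described in the Method section.
predictors = ["religiosity", "internet_connectivity", "liberal_ideology",
              "social_capital", "corruption", "traditionalistic"]

# Zero-order correlations with two-tailed significance, as reported above.
for p in predictors:
    r, sig = pearsonr(states["compliance"], states[p])
    print(f"{p}: r = {r:+.2f}, p = {sig:.3f}")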
Table 4
Hierarchical OLS Regressions Predicting Compliance (N = 51)

Variable                  B       SE B    β
Block 1: Demographic
  Population              .00     .00     .05
  Education               .08     1.01    .03
  Income                 -.25     3.66   -.02
  Non-white              -.10     .13    -.13
  Religious              -.19     .30    -.19
  Southern region         9.59    5.13    .33 (p = .07)
  Incremental R²: 37%
Block 2: Political
  Liberal                 .09     .45     .05
  Traditionalistic       -1.48    4.70   -.06
  Incremental R²: 37%
Block 3: Legal
  Penalties               .08     .04     .08
  Fee-shifting            4.00    3.02    .19 (p = .19)
  Incremental R²: 41%
Total R²: 41%
Total Adjusted R²: 26%
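Note: The hierarchical regression in Table 4 enters the three predictor blocks cumulatively and reports the gain in R² at each step. A minimal sketch of that procedure with statsmodels, again assuming hypothetical column names in states.csv:

import pandas as pd
import statsmodels.formula.api as smf

states = pd.read_csv("states.csv")  # hypothetical state-level file

# Predictor blocks, entered cumulatively (illustrative column names).
blocks = [
    ("Demographic", "population + education + income + nonwhite + religious + south"),
    ("Political", "liberal + traditionalistic"),
    ("Legal", "penalties + fee_shifting"),
]

rhs, prev_r2 = [], 0.0
for label, terms in blocks:
    rhs.append(terms)
    model = smf.ols("compliance ~ " + " + ".join(rhs), data=states).fit()
    print(f"Block ({label}): R2 = {model.rsquared:.2f}, "
          f"incremental R2 = {model.rsquared - prev_r2:.2f}")
    prev_r2 = model.rsquared

# Adjusted R2 for the full model.
print(f"Total adjusted R2 = {model.rsquared_adj:.2f}")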